
# High-precision F1 Evaluation

**XLM-RoBERTa-base-intent-twin** (MIT license)

XLM-RoBERTa-base is a multilingual pre-trained model based on the RoBERTa architecture. It supports Russian and English and is suited to text classification tasks.

Tags: Text Classification · Transformers · Multilingual
Author: forthisdream
© 2025 AIbase